Overview
What is Informatica PowerCenter?
Informatica PowerCenter is a metadata-driven data integration technology designed to form the foundation for data integration initiatives, including analytics and data warehousing, application migration or consolidation, and data governance.
Pricing
Entry-level setup fee?
- No setup fee
Offerings
- Free Trial
- Free/Freemium Version
- Premium Consulting/Integration Services
Features
Data Source Connection
Ability to connect to multiple data sources
- Connect to traditional data sources: 9 (18 ratings)
Ability to connect to traditional data sources like relational databases, flat files, XML files and packaged applications
- Connect to Big Data and NoSQL: 8 (14 ratings)
Ability to connect to non-traditional data sources like Hadoop and other big data technologies, and NoSQL databases
Data Transformations
Data transformations include calculations, search and replace, data normalization and data parsing
- Simple transformations: 8 (18 ratings)
Simple data transformations are calculations, data type conversions, aggregations and search and replace operations
- Complex transformations: 7 (18 ratings)
Complex data transformations are data normalization, advanced data parsing, etc.
Data Modeling
A data model is a diagram or flowchart that illustrates the relationships between data
- Data model creation: 9 (15 ratings)
Ability to create and maintain data models using a graphical tool to define relationships between data
- Metadata management: 8 (16 ratings)
Automated discovery of metadata with ability to synchronize and share metadata with other tools like Master Data Management
- Business rules and workflow: 9 (18 ratings)
Ability to define and manage business rules and workflows
- Collaboration: 6.1 (16 ratings)
Collaboration is enabled by a shared repository of project information and metadata
- Testing and debugging: 9 (17 ratings)
Tool to debug and tune for optimal performance
Data Governance
Data governance is the practice of implementing policies defining effective use of an organization's data assets
- Integration with data quality tools: 9 (15 ratings)
Integration with tools for cleansing, parsing and normalizing data according to business rules
- Integration with MDM tools: 9 (13 ratings)
Integration with master data management tools to ensure data consistency across the organization
Product Details
Informatica PowerCenter Technical Details
Operating Systems | Unspecified
---|---
Mobile Application | No
Reviews and Ratings (92)
Community Insights
- Business Problems Solved
- Pros
- Cons
- Recommendations
PowerCenter is widely used across organizations for a variety of use cases. Users have found the product to be instrumental in addressing their data integration needs and solving challenges related to data quality management, master data management, data masking, and data virtualization. With PowerCenter, users can easily extract, transform, and load data from various sources such as SAP and Salesforce into their data warehouses and data marts. They appreciate how the product optimizes data for reporting tools and facilitates the generation of accurate reports.
Another common use case for PowerCenter is integrating data from disparate sources and migrating data from legacy systems to newer infrastructure. Users have found the product to be efficient in automating ETL processes and ensuring timely and efficient data loads. They also value its advanced security features, which make administration easier. Many users have praised PowerCenter's ease of use, fast development capabilities, and multi-user development environment with a check-out/check-in mechanism that supports collaborative work.
Overall, PowerCenter has become a mainstay ETL tool in organizations due to its scalability, flexibility, and robustness as an ETL engine. From loading data into data warehouses and data marts to feeding information into COTS applications or Teradata data warehouses, PowerCenter has proven to be an indispensable tool for enterprise-wide data integration across a range of industries including marketing, healthcare, finance, and more.
Seamless data integration capabilities: Multiple reviewers have praised Informatica PowerCenter for its seamless data integration capabilities, with users stating that it effortlessly connects to multiple sources and targets. This feature simplifies the complex process of integrating data from various systems, providing a significant advantage in terms of efficiency and productivity.
User-friendly interface: The user-friendly interface of Informatica PowerCenter has been consistently appreciated by reviewers, who find it highly intuitive and easy to navigate. Users state that the application of complicated business rule logic is straightforward, thanks to the intuitive design process for creating ETL mappings and workflows. The minimal learning time and effort required to work with PowerCenter's interface are seen as significant advantages.
Reusability and streamlined development processes: Reviewers have highlighted PowerCenter's ability to capture and share logic in mapplets, enabling reusability and streamlining development processes. This feature contributes to a seamless workflow for developers, making the development and maintenance of code in PowerCenter straightforward. Users value this functionality as it enhances efficiency and promotes consistency across projects.
Lack of Documentation: Several users have expressed frustration with the limited availability and insufficient documentation for Informatica PowerCenter. This has made it challenging for users to achieve advanced tasks and work with complex workflows, hindering their ability to fully utilize the software.
Complex Installation and Configuration: Many reviewers have encountered difficulties during the installation and configuration process of Informatica PowerCenter. The software consists of multiple components that require significant time and resources to set up effectively, causing inconvenience for users.
Limited Integration Capabilities: Some users have found it difficult to integrate code from other languages like Java, Python, or R into Informatica PowerCenter. This limitation forces users to build multiple hops in a data pipeline, resulting in additional complexity and potential inefficiencies in their workflow.
Users recommend Informatica PowerCenter for various tasks such as data migration, ETL, and data integration. They find it to be a reliable tool with strong support and powerful mapping and visualization capabilities. Additionally, users mention its ability to connect and fetch data from different source systems, as well as its capabilities for processing and transforming data. Informatica PowerCenter is highly recommended for companies dealing with large amounts of data and those in need of real-time data integration. Users also mention its plugins for integrating with other applications. While some users acknowledge that Informatica PowerCenter is more expensive than other software, they still recommend it as the best ETL tool in the market.
Attribute Ratings
Reviews (1-21 of 21)
Informatica PowerExchange Connectors: the data with quality and reliability for the enterprise
- Access to different RDB, API's Data (External / Internal)
- The connectors provide access to data from social networks such as Twitter or LinkedIn
- Connectors provide access to data in AS400 systems with minimal interface development
- The dashboard is completely customizable for any organization
- Define customer business rules to put the data in the repository with easy development.
- Quickly transfer and integrate data between cloud and local RDB/systems
- Very simple to navigate between data before and after integration/transformation was made
2. Banking and finance environments, to integrate different data from trading, regulatory reports, decision makers, and fraud and financial crimes, because in this kind of scenario the quality of data is the base of the business.
3. Development and testing departments in enterprises, because you can design environments, outside the production systems, to develop and test new APIs or updates.
Old yet powerful ETL Tool
Mainly used to process semi-structured and unstructured data like web and mobile logs. It has been well tested against large volumes of data and it stood up to expectations.
- Informatica PowerCenter is an innovative software product for ETL-style data integration, with connectivity to almost all database systems.
- Great documentation and customer support.
- It has various solutions to address data quality issues, such as data masking and data virtualization. It has various supporting tools such as MDM, IDQ, Analyst, and Big Data, which can be used to analyze data and correct it.
- It doesn't have many scheduling options, and the tool is not very capable when scheduling a large number of jobs.
- Debugging workflows and mappings is very hard with Informatica PowerCenter.
- Lookup transformations on large tables consume significant memory and CPU.
Amazing ETL Tool
- Extracting, transforming and loading data to and from different databases.
- Development, testing and maintenance of code is fairly simple.
- Good performance even during processing of large amounts of data.
- Amazing ability to connect with different types of sources and targets available in the market.
- Performance with ODBC drivers is comparatively slow, which ends up taking a lot of time.
- Deployment is a bit complex.
- CLOB and BLOB data types are very difficult to handle.
The data integration leader
- Data integration - Informatica has always been a leader in the data integration space. The company has maintained its position by innovative solutions, keeping pace with technology development, providing connectors to most of the common platforms, and improving on services with time. It has the ability to integrate data from multi-cloud, hybrid, on-premise infrastructure setup. The data integration can happen in batch or in real time. The performance of Informatica data integration is among the best in class.
- Data migration - Informatica is widely used as a data migration tool. Lots of enterprises run their software on legacy infrastructure and at some point run into limitations associated with it. The large databases they use by then make it extremely difficult to switch to the latest infrastructure, and such migrations come with a list of challenges. Informatica is great at addressing these challenges. It allows developers to create rule-based workflows that can be used to migrate any amount of data from the old to the new infrastructure. Since Informatica is so successful at this, there are always use cases for reference and trained resources for executing a project.
- Application Integration - The corporate world is full of mergers and acquisitions which lead to one company's IT assets being moved/merged to another one. This requires a huge amount of work to homogenize the two systems so they can co-exist without breaking each other. Informatica is great at building pipelines to integrate many disparate applications.
- Data warehousing - Informatica is very commonly used for building data warehouse systems to fulfill the needs of enterprises, and data marts to service the requirements of individual departments. This improves organizations' ability to use data sets for driving decision making. For example, sales teams can evaluate how their previous sales efforts have performed in various geographies by analyzing the sales data in the data warehouse.
- Tool Management - Informatica has over time become a behemoth of the data integration and warehousing world. It has built a vast array of tools to address various user needs. This does not bode well for the future when compared with newer technologies that do not carry as much technical burden. Most new tools have a great cloud version where you can hop onto a URL, do your work, and deploy it in minutes. With Informatica, you still need multiple client tools just to deploy a single workflow and monitor it as it runs. This can be both confusing and overwhelming to users.
- Only commercial data integration - There is no open-source version of Informatica PowerCenter, so it is not the most useful for small enterprises or individuals that wish to use it but can't afford the maintenance fee, which can be quite a burden. Such users head over to open-source competitors that provide data integration services.
- Lack of integration with other technologies - It is not very easy to blend in code from other languages like Java, Python, R, etc. Most of the tools these days provide such functionality making lives of users easier. Lack of such capability may cause users to build multiple hops in a data pipeline.
- Inbuilt reporting - Although Informatica has been around for a long time, it has not made the best use of all the data capabilities it has. With the amount of data flowing through PowerCenter, it should have been easy to provide some sort of reporting features that would add immense value to a user's work. However, Informatica has not made great use of this opportunity.
Informatica Power Center review
Advanced security features also help you to perform administration more easily and with confidence.
- Transformation speed.
- Supports a multi-User development environment.
- A fast learning curve.
- It needs additional hardware.
- Lots of windows sometimes reduce your focus.
- It would be better to have a more compact design with the ability to aim toward specific topics.
When data is loaded to the Informatica application server, transformations perform very fast. It also supports pushdown optimization if your ETL mappings are simple or of average complexity.
PowerCenter outpowers the cloud version
- The number of connectors, and the coverage for all the possibilities, are excellent, giving us the comfort that we won’t have to build anything custom.
- The flow visualizations are very nice, as we often have to go in and forensically review work done by others who are no longer with the company. This helps us accurately identify source data.
- The alert system is a great feature I think is often underutilized. We’ve been able to set up a great communication tree of notifications when a job fails, saving our DBA’s a lot of time sending out emergency communications.
- The pricing on the connectors is inflated, in my opinion. Plus, you have to pay for them twice if you also utilize IICS (Informatica Cloud). I think you should only have to pay once, regardless of which platform from the vendor you are using. This is a big con for me, as a manager who has to budget for these things.
- It requires more in-depth training than some of the other ETL tools we use, including Informatica Cloud. With Cloud, you can guess and be right 99% of the time. With PowerCenter, you need a few weeks of training, and then still have to ask an expert.
- Setting up secure agents is always difficult, but I’ve seen it done better with other tools like MULE. With Informatica, you tend to need 12 people in a room from all over the business to troubleshoot why the connection is not working.
Informatica PowerCenter, an Enterprise ETL Tool
- Informatica is a proven enterprise ETL tool that scales and provides a graphical interface to code your data mappings.
- Reading, Transforming and Loading Data to and from databases.
- Process incoming data such as messages in real-time.
- Data Cleansing.
- Web Services / Services.
- Data modeling.
Informatica PowerCenter doesn't provide native capabilities for data cleansing, such as USPS zip code validation. You must code this manually.
A powerful ETL solution which focuses on enterprise scalability, flexibility, and code re-usability
- Enforces enterprise wide ETL development standards.
- Provides code re-usability with shared connections and objects.
- Particularly adept at integrating a wide range of disparate data sources (handles flat files particularly well).
- Well suited for moving large amounts of data.
- There are too many ways to perform the same or similar functions, which in turn makes it challenging to trace what a workflow is doing and at which point (for example, sessions can be designed as static or reusable, and the override can occur at the session, the workflow, or both, which can be counterproductive and confusing when troubleshooting).
- The power in structured design is a double-edged sword. Simple tasks for a POC can become cumbersome. For example, if you want to move some data to test a process, you first have to create your sources by importing them, which means an ODBC connection or similar will need to be configured; you in turn have to develop your targets and all of the essential building blocks before being able to begin actual development. While I am on sources and targets, I think of a table definition as just that, and find it counterintuitive to have to define a table as both a source and a target and manage them as different objects. It would be more intuitive to have a single table definition whose source/target properties are determined by where you drag and drop it in the mapping.
- There are no checkpoints or data viewer type functions without designing an entire mapping and workflow. If you would like to simply run a job up to a point and check the throughput, an entire mapping needs to be completed and you would workaround this by creating a flat file target.
For small projects or even smaller development teams with mostly a single data source, expect frustration with being able to quickly test a solution as the design flow is very structured. It is also designed in a way that segregation of duties at a very high level can also cause small development teams to be counter-productive. Each step in the design process is a separate application, and although stitched together, is not without its problems. In order to design a simple mapping for example, you would first need a connection established to the source (example, ODBC) and keep in mind that it will automatically name the container according to how you named your connection. You would then open the designer tool, import a connection as a source, optionally check it in, create a target, optionally check it in as well, and design a transformation mapping. In order to test or run it, you will need to open a separate application (Workflow Manager) and create a workflow from your mapping, then create a session for that workflow and a workflow for those one or more sessions at which point you can test it. After running it, in order to observe, you then need to open a separate application (Monitor) to see what it is doing and how well. For a developer coming from something like SSIS, this can be daunting and cumbersome for building a simple POC and trying to test it (although from the inverse, building an enterprise scalable ETL solution from SSIS is its own challenge).
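For reference, the run-and-monitor part of the walkthrough above can also be driven from the command line with the pmcmd utility that ships with PowerCenter, which spares you the Monitor client for routine runs once the mapping and workflow already exist. All service, domain, folder, credential, and workflow names below are hypothetical placeholders:

```shell
# Start a workflow on a given Integration Service and wait for it to finish.
# IS_DEV, Domain_Dev, DEMO_FOLDER, and wf_load_customers are example names.
pmcmd startworkflow \
  -sv IS_DEV -d Domain_Dev \
  -u etl_user -p etl_password \
  -f DEMO_FOLDER -wait wf_load_customers

# pmcmd exits with 0 when the workflow succeeded.
if [ $? -eq 0 ]; then
  echo "Workflow completed successfully"
fi

# Query run details (status, start/end times) without opening the Monitor client.
pmcmd getworkflowdetails \
  -sv IS_DEV -d Domain_Dev \
  -u etl_user -p etl_password \
  -f DEMO_FOLDER wf_load_customers
```

This does not remove the client tools from design time, but it can take the Monitor application out of the day-to-day run loop and makes workflow runs scriptable from an external scheduler.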
Sheer Performance (but not for MacOS!)
- Informatica has a wide range of support for databases. Pretty much every mainstream DBMS is compatible here.
- Designing ETL mappings and workflows is a very intuitive process, and takes minimal learning time and effort even for a beginner.
- Informatica's biggest strength is its sheer performance. It is unmatched in terms of handling large volumes of data.
- Setting up Informatica and integrating with your existing services can be a hassle. While the support team is quite helpful, it can still take a considerable amount of time and effort to get Informatica up and running.
- Informatica Enterprise Data Integration is unavailable for macOS. This can be a huge problem for a business that primarily uses Apple machines.
- Using Informatica PowerCenter for ETL designing can be quite intuitive for basic to moderately complex workflows. However, for achieving advanced tasks, there is not sufficient documentation available.
On the other hand, if your business primarily works on Apple machines, Informatica Enterprise Data Integration is a resounding NO. Running Informatica on a VM or a Remote Desktop completely kills its efficiency.
- Multi platform connectivity, including Hadoop and Cloud providers
- Deployment and Licensing flexibility
- Pricing
- Skills availability in the market
- Deployment is flexible, but it needs to be simpler
- Difficult to ingest very large datasets, though Informatica's Big Data Edition makes it simple (but that comes at an additional cost)
Informatica Review
- Shared objects allow transformations, sources, and targets to be reused.
- Mapplets allow for blocks of transformations to be reused.
- Field by field transformations.
- Union and join a variety of data sources.
- The UI is a little ugly - it looks like screenshots of the tool from 10+ years ago.
- Needs undo. Really, it doesn't have undo? Yes. Really.
Good ETL Platform
- It is quite flexible in handling different types of data formats like DB objects or tables
- Good performance when processing lots of data in batch
- Easy to learn and use
- The client is quite heavy in terms of size and function
- The upgrade process could be better
- Test and deployment automation needs to be improved
PowerCenter works well with large, structured data files
- PowerCenter processes input files, performs specified transformations, and maps the input data format to the output data format very quickly. The PowerCenter backend implementation seems to be optimized to process and map structured input records to structured output records and load the records into a database. One of the strengths of PowerCenter is performance of processing petabytes of structured input data files.
- PowerCenter does not require a software development experience or education. After providing initial hands-on training, the data consultants (who are statisticians, subject matter experts) in our organization were able to implement data ingest and data transformation tasks fairly easily.
- PowerCenter supports multiple DBMS technologies (for example, Oracle, Netezza). This flexibility allows it to be used by multiple departments within our organization.
- One of the challenges of PowerCenter is the lack of integration between the components and functionality it provides. PowerCenter consists of multiple components such as the repository service, integration service, and metadata service. Considerable time and resources were required to install and configure these components before PowerCenter was available for use.
- In order to connect to various data sources such as a Netezza database or SAS datasets, PowerCenter requires the installation and configuration of separate plug-ins. We spent considerable time troubleshooting and debugging problems while trying to get the various plug-ins integrated with PowerCenter and up and running as described in the documentation.
- PowerCenter works well with structured data. That is, it is easy to work with input and output data that is pre-defined, fixed, and unchanging. It is much more difficult to work with dynamic data in which new fields are added or removed ad-hoc or if data format changes during the data ingest process. We have not been as successful in using PowerCenter for dynamic data.
- One of the challenges of learning PowerCenter is that it is difficult to find documentation or publications that help you learn the various details about PowerCenter software. Unlike SAS Institute, Informatica does not publish books about PowerCenter. The documentation available with PowerCenter is sparse; we have learned many aspects of this technology through trial and error.
PowerCenter is well suited for processing of large amounts of data that is structured and pre-defined. It is well-suited for large organizations that have the resources to install, configure and support PowerCenter. It is well suited for large organizations that have a large number of data consultants/analysts that do not have a software development/programming background.
PowerCenter is not a good fit for smaller, agile organizations that work with unstructured data and changing/dynamic data.
My personal view of Informatica Enterprise Data Integration tools based on my 10+ years of user experiences.
- Informatica PowerCenter tools are very effective at extracting, transforming, and loading bulk data, such as hundreds of millions of database rows, raw flat files, XML files, etc., out of or into all kinds of platforms: databases (Oracle, DB2, SQL Server, MySQL, any database you can name), operating systems (UNIX, Linux, Windows), web services, and real-time sources such as IBM MQ.
- A single toolset to be used for all kinds of platforms, for bulk data loads, for real-time loads, or web services.
- The Informatica web service interface can be improved from a performance point of view and also from a troubleshooting point of view.
Day to day user's opinion of PowerCenter
- The tool is excellent at pulling data from source systems, modifying it to fit a target system, and then pushing the data accordingly.
- It handles conversion of datatypes very well
- PowerCenter is very adept at applying data filters and 'business rules' to source data before pushing the results to the end user.
- As a developer/integrator, I feel that PowerCenter could use improvement in the area of automated deployments. There aren't really any scripting options to automate back-end deployments of ETL workflows from one environment to the next. Instead, everything has to be done via the GUI (graphical user interface). And while this is a relatively straightforward process, it can be time consuming.
- There is no real interface between PowerCenter and programs that manage the encapsulation of passwords for system accounts. All connections to source and target systems have to be updated via the PowerCenter Workflow Manager GUI and entered manually. There is no interface with encryption programs such as MAC VAULT, and as such, admins are required to have access to passwords that information security departments might otherwise not want them to have.
However, if you are looking for a tool to simply look at data from one source, there are other products out there. This is really designed for aggregating data and filtering/manipulating it into useful information.
PowerCenter for ETL
- Data migration from multiple sources. It handles any type of source data including RDBMS, flat files, XML, mainframe etc.
- Implementing data migration rules is easy, efficient, and reusable.
- Development and maintenance of code is very easy. Design, development, and scheduling of full loads and incremental loads are very easy.
- Provides a lot of features for developers to implement any kind of business rules.
- Not worth doing simple data migration using PowerCenter.
- Licensing cost.
- Handling BLOB and CLOB data types.
One stop for all ETL needs
- Ability to work with different types of sources and targets.
- Version control of the code.
- Ability to integrate with LDAP for security.
- Ease of use.
- Performance with ODBC drivers can be improved.
- Memory utilization is very high during ETL execution.
- The in-built scheduler needs improved scheduling capabilities.
PowerCenter is an industry-leading integration tool
- PowerCenter connects to multiple sources and targets with ease, and at the same time.
- PowerCenter allows the application of complicated business rule logic with ease through a user-friendly interface.
- PowerCenter logic can be captured and shared in mapplets, enabling reusability.
- Documentation of how the different offerings in the Informatica product line work together as a total data platform, and licensing across those offerings. This information is difficult to come by and more often than not requires Informatica Professional Services assistance.
- Copy and paste of elements to provide additional documentation and coding capabilities.
- Interface with industry standards such as Microsoft Excel.
Integration with Informatica
- Tracking Changes in slowly changing dimensions
- Fact & multi-dimensional loading
- Integration of data from SAP and Salesforce is better with this ETL tool compared to other tools in the market
- Modularity
- Several partnerships diminishing the value of technologies
- Unable to get a list of objects from the Repository (like sources & targets) that don't have any dependencies
- Scheduling: The built-in scheduling tool has many constraints, such as handling Unix/VB scripts. Most enterprises use third-party tools for this.
- Modularity
- Backward compatibility
- Easy upgrade
- Support from vendor
Informatica - Review at a glance
- Extraction of data from legacy sources (usually heterogeneous)
- Data transformation (data optimized for transactions --> data optimized for analysis)
- Synchronization and cleansing of data
- Loading the data into the data warehouse.
A Powerful ETL Tool in Informatica PowerCenter
- Informatica is a complex, full-featured data integration tool. We use Informatica primarily to extract data from our ERP systems, transform it, and load it into operational and dimensional data stores.
- We have found that upgrading from version to version can be a bit clunky and complex.
- Watch out for their audit department. They will hunt down development environments that are in use and try to back charge you for their repositories. Ensure your development environments are properly licensed.